Fault detection for geological drilling processes using multivariate generalized Gaussian distribution and Kullback-Leibler divergence
Authors
Abstract
Similar Resources
Gaussian Mixture Reduction Using Reverse Kullback-Leibler Divergence
We propose a greedy mixture reduction algorithm which is capable of pruning mixture components as well as merging them based on the Kullback-Leibler divergence (KLD). The algorithm is distinct from the well-known Runnalls’ KLD-based method since it is not restricted to merging operations. The capability of pruning (in addition to merging) gives the algorithm the ability to preserve the peaks ...
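To make the contrast concrete: the Runnalls-style step that this snippet refers to merges the pair of components whose moment-matched merge gives the smallest upper bound on the KLD increase. A minimal sketch of that merge and its cost, with illustrative function names not taken from either paper:

```python
import numpy as np

def merge_moment_matched(w1, m1, C1, w2, m2, C2):
    # Replace two weighted Gaussian components by their moment-matched merge.
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    d1, d2 = m1 - m, m2 - m
    C = (w1 * (C1 + np.outer(d1, d1)) + w2 * (C2 + np.outer(d2, d2))) / w
    return w, m, C

def runnalls_cost(w1, m1, C1, w2, m2, C2):
    # Runnalls' upper bound on the KLD increase caused by the merge.
    _, _, C = merge_moment_matched(w1, m1, C1, w2, m2, C2)
    logdet = lambda A: np.linalg.slogdet(A)[1]
    return 0.5 * ((w1 + w2) * logdet(C) - w1 * logdet(C1) - w2 * logdet(C2))
```

A pruning-capable reduction, as proposed above, would additionally consider dropping a component outright (renormalizing the remaining weights) whenever that is cheaper in KLD terms than any available merge.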
Kullback-Leibler Divergence Measure for Multivariate Skew-Normal Distributions
The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multivariate normal distribution with the skew-multivariate normal distribution, showing that this is equivalent to comparing univariate versions of these...
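For reference, the Jeffreys divergence mentioned here is the symmetrized Kullback-Leibler divergence:

```latex
J(P,Q) = D_{\mathrm{KL}}(P \,\|\, Q) + D_{\mathrm{KL}}(Q \,\|\, P),
\qquad
D_{\mathrm{KL}}(P \,\|\, Q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx .
```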
Kullback-Leibler Divergence for the Normal-Gamma Distribution
We derive the Kullback-Leibler divergence for the normal-gamma distribution and show that it is identical to the Bayesian complexity penalty for the univariate general linear model with conjugate priors. Based on this finding, we provide two applications of the KL divergence, one in simulated and one in empirical data.
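The derivation in that paper is closed-form; as an independent numerical check, the divergence can also be estimated by Monte Carlo. A minimal sketch, assuming the common (m, kappa, a, b) parameterization in which lambda ~ Gamma(a, rate b) and mu | lambda ~ N(m, 1/(kappa*lambda)); the function names are illustrative, not from the paper:

```python
import numpy as np
from scipy import stats

def ng_logpdf(mu, lam, m, kappa, a, b):
    # Normal-gamma log-density: Gamma prior on the precision lambda,
    # conditionally normal mean mu with precision kappa * lambda.
    return (stats.gamma.logpdf(lam, a, scale=1.0 / b)
            + stats.norm.logpdf(mu, m, np.sqrt(1.0 / (kappa * lam))))

def ng_sample(m, kappa, a, b, size, rng):
    lam = rng.gamma(a, 1.0 / b, size)
    mu = rng.normal(m, np.sqrt(1.0 / (kappa * lam)))
    return mu, lam

def kl_normal_gamma_mc(p, q, n=100_000, seed=0):
    # Monte Carlo estimate of KL(p || q) for parameter tuples (m, kappa, a, b).
    rng = np.random.default_rng(seed)
    mu, lam = ng_sample(*p, n, rng)
    return np.mean(ng_logpdf(mu, lam, *p) - ng_logpdf(mu, lam, *q))
```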
Gaussian Approximations of Small Noise Diffusions in Kullback-Leibler Divergence
We study Gaussian approximations to the distribution of a diffusion. The approximations are easy to compute: they are defined by two simple ordinary differential equations for the mean and the covariance. Time correlations can also be computed via solution of a linear stochastic differential equation. We show, using the Kullback-Leibler divergence, that the approximations are accurate...
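Under the standard linearization, for a scalar small-noise diffusion dX = f(X) dt + sqrt(eps) dW the two ODEs read dm/dt = f(m) and dS/dt = 2 f'(m) S + eps. A minimal Euler sketch under that assumption (the exact formulation in the paper may differ in detail):

```python
import numpy as np

def gaussian_approx(f, df, m0, S0, eps, T, dt):
    # Euler integration of the mean/covariance ODEs for the scalar
    # small-noise diffusion dX = f(X) dt + sqrt(eps) dW:
    #   dm/dt = f(m),   dS/dt = 2 f'(m) S + eps.
    m, S = m0, S0
    ms, Ss = [m], [S]
    for _ in range(int(T / dt)):
        m, S = m + dt * f(m), S + dt * (2.0 * df(m) * S + eps)
        ms.append(m)
        Ss.append(S)
    return np.array(ms), np.array(Ss)

# Ornstein-Uhlenbeck drift f(x) = -x, for which the Gaussian
# approximation is exact.
ms, Ss = gaussian_approx(lambda x: -x, lambda x: -1.0,
                         m0=1.0, S0=0.0, eps=0.01, T=5.0, dt=0.001)
```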
Rényi Divergence and Kullback-Leibler Divergence
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler ...
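For reference, the Rényi divergence of order alpha (alpha > 0, alpha != 1) between densities p and q is

```latex
D_{\alpha}(P \,\|\, Q) = \frac{1}{\alpha - 1}
\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx ,
\qquad
\lim_{\alpha \to 1} D_{\alpha}(P \,\|\, Q) = D_{\mathrm{KL}}(P \,\|\, Q),
```

so the order-1 case referred to above is exactly the Kullback-Leibler divergence.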
Journal
Journal title: IFAC-PapersOnLine
Year: 2020
ISSN: 2405-8963
DOI: 10.1016/j.ifacol.2020.12.115